
BMC Medicine

Springer Science and Business Media LLC

Preprints posted in the last 30 days, ranked by how well they match BMC Medicine's content profile, based on 163 papers previously published here. The average preprint has a 0.23% match score for this journal, so anything above that is already an above-average fit.

1
Effect mechanisms of different malaria chemoprevention regimens in pregnancy on infant growth outcomes: causal mediation analysis of a randomized controlled trial

Nguyen, A. T.; Nankabirwa, J. I.; Kakuru, A.; Roh, M. E.; Aguti, M.; Adrama, H.; Kizza, J.; Olwoch, P.; Kamya, M. R.; Dorsey, G.; Jagannathan, P.; Benjamin-Chung, J.

2026-04-25 · public and global health · doi:10.64898/2026.04.17.26351121 · medRxiv
Top 0.1% · match score 32.9%

Introduction: Intermittent preventive treatment in pregnancy (IPTp) with sulfadoxine-pyrimethamine (SP) has become less effective at preventing malaria due to rising parasite resistance. IPTp with dihydroartemisinin-piperaquine (DP) alone or in combination with SP (DP+SP) dramatically lowers the risk of malaria in pregnancy compared to SP but is associated with lower birthweight and early life wasting. We estimated the effect of IPTp-DP, DP+SP, and SP on infant growth outcomes and assessed possible treatment mechanisms through a causal mediation analysis.

Methods: We used infant follow-up data (N=761) from a trial (NCT04336189) that randomized pregnant women to receive monthly IPTp-DP, SP, or DP+SP. We compared weight-for-length (WLZ) and length-for-age (LAZ) z-scores between treatment arms. We assessed possible mediation through pregnancy, birth, and infancy factors using interventional indirect effect models.

Results: Compared to IPTp-SP, IPTp-DP+SP decreased mean WLZ by 0.18 [95% confidence interval (CI) -0.03, 0.39] between 1-3 months and 0.28 (95% CI 0.07, 0.49) between 4-6 months, with the largest differences among primigravidae. Lower risk of active placental malaria in IPTp-DP+SP helped reduce differences in mean WLZ vs IPTp-SP (+0.06, 95% CI 0.02, 0.10). The IPTp-DP+SP arm had up to 0.28 lower mean LAZ between 7-13 months compared to IPTp-DP, particularly among children who were wasted between 0-6 months; low birthweight had a persistent, mediating effect on linear growth.

Conclusion: Adverse birth outcomes contributed to early growth faltering among children born to mothers receiving IPTp-DP+SP vs IPTp-SP, but the prevention of placental malaria partially counteracted the negative effects of IPTp-DP+SP on ponderal growth.
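The abstract above reports mediated effects (e.g. a +0.06 WLZ contribution from averted placental malaria). As a rough illustration only, the classic product-of-coefficients decomposition shows how a mediated share of a treatment effect is computed; this is a simplified textbook sketch, not the interventional indirect-effect models the authors used, and all numbers are invented.

```python
# Illustrative product-of-coefficients mediation decomposition.
# A simplified sketch, NOT the interventional indirect-effect models
# from the preprint; all numbers below are hypothetical.

def mediation_decomposition(a, b, c_prime):
    """a: treatment -> mediator effect; b: mediator -> outcome effect
    (holding treatment fixed); c_prime: direct treatment -> outcome effect."""
    indirect = a * b                      # effect transmitted via the mediator
    total = c_prime + indirect            # total treatment effect
    prop_mediated = indirect / total if total else float("nan")
    return {"indirect": indirect, "total": total,
            "proportion_mediated": prop_mediated}

# Hypothetical values: treatment lowers placental-malaria risk (a),
# which in turn shifts infant WLZ (b); c_prime is the direct effect.
effects = mediation_decomposition(a=-0.30, b=-0.20, c_prime=0.22)
```

In this toy example the indirect path contributes +0.06 of a +0.28 total effect, so roughly a fifth of the effect is mediated.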

2
The association between severity and aetiology of chronic liver disease and seasonal influenza vaccination uptake in adults: a retrospective cohort study using English primary care data (2019-2024)

Haeusler, I. L.; Etoori, D.; Campbell, C. N. J.; McDonald, S. L. R.; Lopez Bernal, J.; Mounier-Jack, S.; Kasstan-Dabush, B.; McDonald, H. I.; Parker, E. P. K.; Suffel, A.

2026-04-11 · public and global health · doi:10.64898/2026.04.08.26350434 · medRxiv
Top 0.1% · match score 26.6%

Background: In England, individuals with chronic liver disease (CLD) are among those with the lowest seasonal influenza vaccine uptake despite being at elevated risk of severe influenza. We examined the relationship between CLD severity and aetiology and influenza vaccine uptake in England.

Methods: A retrospective cohort study of adults (18-115 years) using Clinical Practice Research Datalink Aurum primary care data was conducted for five seasons (2019/20-2023/24). Poisson regression was used to estimate rates of uptake by CLD severity (clinical diagnoses categorised as low, moderate, or severe) and aetiology (alcohol-related, viral-related, and diagnoses in the Green Book guidelines).

Findings: There were 182,174-277,470 individuals with CLD per cohort. Among those who were additionally age-eligible for vaccination, uptake was 71.1-79.7% compared to 30.9-40.5% in those not additionally age-eligible. Among individuals below age eligibility without other comorbidities, severity was associated with higher uptake (incidence rate ratio [IRR] moderate 1.80, 95% CI 1.69-1.90; severe 1.95, 95% CI 1.84-2.08 in 2023/24); there was no effect in those with at least one additional comorbidity (moderate 1.05, 95% CI 0.99-1.10; severe 1.05, 95% CI 1.01-1.09). Alcohol- and viral-related aetiology were also associated with increased uptake in those not additionally age-eligible. Among individuals meeting age eligibility without additional comorbidities, severity was associated with reduced uptake (moderate 0.81, 95% CI 0.73-0.90; severe 0.79, 95% CI 0.74-0.85), with attenuation in those with additional comorbidities (moderate 0.99, 95% CI 0.94-1.04; severe 0.91, 95% CI 0.89-0.94).

Interpretation: CLD severity and aetiology were important determinants of uptake in the absence of additional indications for influenza vaccination. Future research should prioritise understanding facilitators and barriers to vaccine uptake in individuals with CLD, particularly for those at highest risk of severe infection.

Funding: NIHR Health Protection Research Unit in Vaccines and Immunisation (NIHR200929/NIHR207408).

Research in context

Evidence before this study: We searched PubMed up to June 2025 using the terms "chronic liver disease", "cirrhosis", "hepatitis", "influenza vaccination", "seasonal influenza", and "vaccine uptake". Previous research, including national data from England, has shown that people with chronic liver disease tend to have lower seasonal influenza vaccine uptake than individuals with other medical comorbidities which qualify for vaccination, such as diabetes, chronic kidney disease or immunosuppression. The reasons for low influenza vaccine uptake in people with chronic liver disease are not well understood, and it is therefore difficult for vaccination providers, principally primary care services in England, to tailor interventions aimed at increasing uptake. Qualitative research involving individuals aged less than 65 years living in England with clinical risk comorbidities, most commonly diabetes, found that chronic disease management pathways inconsistently provided information about the importance of influenza vaccination as part of chronic disease management. Individuals with long-term conditions reported low perceived risk of influenza infection and limited awareness of vaccine benefits as important reasons for non-uptake. We hypothesised that the severity and aetiology of chronic liver disease may be important determinants of uptake.

Added value of this study: We conducted a population-based study to examine how chronic liver disease severity and aetiology influence seasonal influenza vaccine uptake in adults in England. Using primary care electronic health record data from five consecutive influenza seasons (2019/20-2023/24), we found that more severe chronic liver disease was associated with a substantial increase in vaccine uptake in those without additional indications for seasonal influenza vaccination (age-based eligibility or other qualifying clinical risk comorbidities). Alcohol- and viral-related aetiology were also associated with increased uptake in those who were not additionally age-eligible for vaccination. In contrast, severity and alcohol- and viral-related underlying aetiology were associated with a modest reduction in uptake for individuals with chronic liver disease who also qualified for vaccination due to age.

Implications of all the available evidence: Despite clear clinical vulnerability to infection and a substantially elevated risk of morbidity and mortality following infection, a large proportion of adults with chronic liver disease, particularly those aged under 65 years, remain unvaccinated against seasonal influenza each year. This study suggests that chronic liver disease severity and underlying aetiology are important determinants of uptake in individuals not meeting age-based vaccine eligibility, particularly in those without additional clinical risk comorbidities. This could be because of differing perceptions of influenza risk, or due to varying degrees of interaction with healthcare specialists as part of chronic disease management. In individuals who met age-based vaccination eligibility, the negative effect of severity on influenza vaccine uptake may reflect greater barriers to accessing vaccination services by those with more complex health needs, or competing medical priorities for long-term condition management during consultations. To inform targeted vaccination strategies, future research should aim to understand the specific facilitators and barriers to influenza vaccination experienced by individuals with chronic liver disease. This should include perspectives of individuals with different disease severity, across different age groups, in those with and without additional comorbidities.
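The findings above are reported as incidence rate ratios (IRRs), the quantity a Poisson regression with a log link estimates for a binary exposure. A minimal sketch of that quantity, computing an IRR and a Wald 95% CI directly from event counts and person-time; the counts below are invented, not the study's data.

```python
import math

# Minimal sketch: incidence rate ratio (IRR) with a Wald 95% CI from
# event counts and person-time. Numbers are illustrative placeholders,
# not taken from the study.

def irr_wald_ci(events_exp, time_exp, events_ref, time_ref, z=1.96):
    """IRR of exposed vs reference, with a Wald CI on the log scale."""
    irr = (events_exp / time_exp) / (events_ref / time_ref)
    se_log = math.sqrt(1 / events_exp + 1 / events_ref)  # Poisson counts
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical: 900 vaccinations over 2,000 person-seasons (severe CLD)
# vs 500 over 2,000 person-seasons (low-severity reference).
irr, lo, hi = irr_wald_ci(900, 2000, 500, 2000)
```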

3
Practical alcohol risk-reduction advice plus a brief commitment declaration in a social drinking laboratory: a pilot cluster randomized trial

Yoshimoto, H.; Hadano, T.; Shimada, K.; Gosho, M.; Fukuda, T.; Komano, Y.; Umeda, K.; Iwase, M.; Kusano, Y.; Kawabata, T.

2026-04-21 · public and global health · doi:10.64898/2026.04.19.26351067 · medRxiv
Top 0.1% · match score 17.8%

Background: Practical alcohol risk-reduction strategies are widely recommended in public-facing alcohol guidance, but randomized evidence from socially interactive drinking episodes remains limited. We conducted a pilot cluster randomized trial to evaluate the feasibility and preliminary effects of a package intervention comprising practical drinking-strategy information, participant self-selection of same-day strategies, and a brief commitment declaration in a social drinking laboratory.

Methods: This single-center, parallel-group pilot trial was conducted in Japan. Pre-existing social groups participated. One or two groups scheduled in the same session slot were combined into a time-slot allocation unit, which was randomized 1:1 either to the package intervention or to alcohol-related knowledge only. The primary outcome was total pure alcohol intake during the first 120 min. Session satisfaction on a Visual Analog Scale (VAS) was a prespecified secondary participant-experience outcome.

Results: Of 83 interested individuals, 63 were randomized, and 59 participants in 17 social groups and 12 allocation units were included in the modified intention-to-treat analysis. The mean paired intervention-control difference for 120-min alcohol intake was -8.84 g (95% confidence interval [CI] -27.92 to 10.23; exact sign-flip p = 0.281). The corresponding exploratory 0-30 min difference was -4.90 g (95% CI -10.48 to 0.68; exact sign-flip p = 0.094). In a genotype-adjusted participant-level sensitivity analysis, the intervention coefficient for 120-min intake was -16.0 g (95% CI -30.9 to -1.1; p = 0.036). Session satisfaction was high in both arms with no clear between-arm difference. Next-day follow-up was 100%, and no adverse-event-related discontinuations occurred.

Conclusions: The intervention was feasible to deliver in a socially interactive drinking setting, and session satisfaction was high in both arms. Primary allocation-unit estimates favored lower alcohol intake but were imprecise. Larger trials are needed to estimate effects more precisely, while considering the potential influence of genotype imbalance on effect estimation in East Asian samples.

Trial registration: University Hospital Medical Information Network Clinical Trials Registry (UMIN-CTR) UMIN000060685. Registered 17 February 2026.
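The trial reports exact sign-flip p-values for paired allocation-unit differences. A minimal stdlib sketch of that style of test, enumerating all 2^n sign assignments of the paired differences; the differences below are invented, not the trial's data.

```python
from itertools import product

# Exact two-sided sign-flip (permutation) test on paired
# intervention-minus-control differences. Illustrative sketch with
# invented data; the trial's exact analysis may differ in detail.

def sign_flip_p(diffs):
    """P-value: fraction of the 2^n sign flips whose |mean| is at least
    as extreme as the observed |mean|."""
    n = len(diffs)
    obs = abs(sum(diffs) / n)
    extreme = 0
    for signs in product((1, -1), repeat=n):
        m = abs(sum(s * d for s, d in zip(signs, diffs)) / n)
        if m >= obs - 1e-12:          # small tolerance for float ties
            extreme += 1
    return extreme / 2 ** n

# Hypothetical per-allocation-unit differences in grams of pure alcohol.
p = sign_flip_p([-12.0, -3.5, 4.0, -9.0, -6.5, 2.0])
```

With only a handful of allocation units the test is exact but coarse, which is one reason pilot estimates like those above are imprecise.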

4
Accelerometer-derived circadian rhythm and colorectal cancer risk in UK Biobank: a prospective cohort study

Ni Chan Chin, M.; Berrio, J. A.

2026-04-05 · oncology · doi:10.64898/2026.04.03.26350124 · medRxiv
Top 0.1% · match score 14.3%

Background: While total physical activity is a recognized modifier of cancer risk, accelerometer-derived digital phenotyping enables high-resolution mapping of circadian behavior. Whether these multidimensional patterns comprising step counts, sleep, physical activity, circadian rhythmicity, and light exposure independently influence the risk of incident colorectal cancer (CRC) has not been comprehensively evaluated.

Methods: We performed an exposure-wide association study (ExWAS) of 224 accelerometer-derived metrics among 95,050 UK Biobank participants who were free of CRC at accelerometry. To comprehensively define circadian rhythm patterns, we systematically categorized these metrics into five core behavioral domains: step counts, sleep architecture, physical activity bouts, circadian rhythmicity, and light exposure. Hazard ratios (HRs) and 95% confidence intervals were estimated using Cox proportional hazards models with age as the underlying timescale.

Results: During a median follow-up of 8.5 years, 775 participants developed CRC (503 colon; 269 rectal). In minimally adjusted models, 121 metrics showed nominal significance (31 for overall CRC, 89 for colon, and 1 for rectal cancer). Protective associations were predominantly observed for metrics characterizing activity intensity and bout structure; notably, higher mean acceleration during 5-10 minute bouts of moderate-to-vigorous physical activity was associated with reduced CRC risk (HR 0.88 per SD). In contrast, no metrics within the defined sleep or light exposure domains reached nominal significance. These associations attenuated substantially following progressive adjustment for lifestyle and metabolic covariates, suggesting potential confounding or shared biological pathways.
Conclusions: Our findings identified specific behavioral phenotypes within a multidimensional framework of circadian rhythm, including step counts, physical activity intensity, and bout structure, as being associated with CRC risk. However, the marked attenuation of signals after multivariable adjustment suggests these markers may not serve as independent predictors. These results underscore the complexity of multidimensional circadian digital biomarkers and necessitate independent replication to clarify their utility in cancer risk stratification.

5
Defining the potential impact and cost-effectiveness of a non-invasive diagnostic for malaria: a modeling study

Hansen, M. A.; de Nooy, A.; Calarco, S.; Tetteh, K. K.; Nichols, B. E.

2026-04-01 · health economics · doi:10.64898/2026.03.31.26349813 · medRxiv
Top 0.1% · match score 14.3%

Background: Malaria rapid diagnostic tests (RDTs) are widely used to detect and treat malaria infections, yet a diagnostic gap remains. With turnaround times of ~15 minutes, RDTs may be too slow to enable broad-scale implementation in certain contexts. Novel non-invasive diagnostics (NIDs) have potential to provide faster (<5 minutes), sensitive (90% for symptomatic, 65% for asymptomatic carriage), and cost-effective alternatives, which may increase testing throughput, enhance case detection, guide appropriate antimicrobial use, and reduce waste by using fewer consumables. Their potential impact has yet to be investigated.

Methods: We modeled a country-agnostic population of 10 million individuals to assess the impact of population-level scale-up of four malaria testing strategies for active case-finding: 1) current practice (50% syndromic diagnosis and 50% RDTs), 2) full RDT scale-up, 3) full NID implementation, and 4) NID screening plus confirmatory RDT, using a decision-tree model of the malaria diagnostic and care cascade. We varied prevalence (0.02-0.25) and the proportion of cases with symptoms (0.05-0.60) to evaluate strategy performance across epidemiological contexts. We investigated case detection rates, antimicrobial use, incremental cost-effectiveness ratios (ICERs) per disability-adjusted life year (DALY) averted, net positive treatment outcomes, and threshold performance levels at which an NID would outperform RDTs.

Results: Full NID implementation (strategy 3) yielded the highest case detection rates (up to 85%), followed by strategies 2, 4, and 1 (45%, 38%, 36%, respectively). NID-based methods (strategies 3 and 4) saved costs, and RDT scale-up was cost-effective at averting DALYs compared to current practice (ICERs: $60-1,270). Despite high case detection, universal NID testing markedly increased unnecessary antimicrobial use. Overall, our results suggest that an NID with 55% asymptomatic sensitivity and 84% specificity, followed by RDT confirmation (strategy 4), could simultaneously improve case detection, reduce antimicrobial overuse, and limit costs.

Conclusions: This modeling analysis suggests that NIDs can sustainably optimize malaria case detection in symptomatic and asymptomatic cases and reduce costs, potentially making them a valuable addition to the diagnostic toolbox. When paired with confirmatory RDTs, they could help reduce inappropriate antimicrobial use, supporting drug efficacy amid rising resistance. Further research should assess their real-world utility, feasibility, and scalability for malaria surveillance and elimination efforts.
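The strategy comparison above rests on decision-tree arithmetic: expected case detection given sensitivity by symptom status, and an ICER per DALY averted. A toy sketch of both calculations; the sensitivities echo the NID figures quoted in the abstract, while the costs and DALY counts are invented placeholders.

```python
# Toy decision-tree arithmetic for a test-and-treat strategy.
# Sensitivities mirror the NID figures quoted above; cost and DALY
# numbers are invented placeholders, not the study's inputs.

def detected(prev, symptomatic_frac, sens_symp, sens_asymp, coverage=1.0):
    """Expected fraction of all infections detected under full testing."""
    symp = prev * symptomatic_frac * sens_symp
    asymp = prev * (1 - symptomatic_frac) * sens_asymp
    return coverage * (symp + asymp) / prev

def icer(cost_new, cost_old, dalys_new, dalys_old):
    """Incremental cost per DALY averted (negative if the new strategy saves money)."""
    return (cost_new - cost_old) / (dalys_old - dalys_new)

share = detected(prev=0.10, symptomatic_frac=0.30,
                 sens_symp=0.90, sens_asymp=0.65)

# Hypothetical: new strategy costs $2.0M vs $1.5M and averts 1,000 DALYs.
ratio = icer(2_000_000, 1_500_000, dalys_new=9_000, dalys_old=10_000)
```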

6
High-dimensional multiomics reveals perturbations to IL-6/IL-6R axis and RUNX3 in CD4+ T cells during third trimester pregnancy

Habel, J.; Nguyen, T. H. O.; de Alwis, N.; Allen, E. K.; Li, S.; Juno, J. A.; Kent, S. J.; Bond, K.; Williamson, D.; Lappas, M.; Hannan, N.; Walker, S.; Schroeder, J.; Crawford, J. C.; Thomas, P.; Kedzierska, K.; Rowntree, L.

2026-03-30 · immunology · doi:10.64898/2026.03.26.711478 · medRxiv
Top 0.1% · match score 12.8%

Objectives: CD4+ T cells play key roles in regulating immune responses during pregnancy; we therefore aimed to understand the CD4+ T cell surface proteome and transcriptome during pregnancy.

Methods: CD4+ T cells were analysed in blood and decidua from term pregnancies (>37 weeks), and in non-pregnant blood. >350 surface proteins were screened via flow cytometry, and transcriptomes were analysed using single-cell RNA sequencing with >130 CITE-seq barcoded antibodies.

Results: Surface protein screening identified changes to ILT4/CD85d, CD9, IFN-γ receptor β-chain, CX3CR1 and CCR5 in pregnant blood and decidual CD4+ T cells. CX3CR1 and CCR5 had the highest expression on the effector-memory T cell (TEM) subset in the blood, with expression consistent across subsets in decidua. CD126/IL-6R was lower in pregnant blood and decidual CD4+ T cells, while scRNAseq identified enrichment in the IL-6R signalling pathway in naive CD4+ T cells in pregnant blood. Both sIL-6R and IL-6 concentrations were increased in plasma during pregnancy, suggesting perturbations to the IL-6/IL-6R signalling axis. Meanwhile, decidual CD4+ T cells had increased expression of the transcription factor RUNX3 in the CD69+ tissue-resident-like subset.

Conclusions: Our findings demonstrate altered molecular expression in CD4+ T cells during pregnancy. This provides important mechanistic insight into their adaptation and regulation during placental development, which may drive placental dysfunction or pregnancy complications including preeclampsia, fetal growth restriction and stillbirth. These new data may inform future studies that focus on determining the significance of differentially-expressed immune features in pregnancy to identify potential targets for immune modulation to treat pregnancy complications and infections.

7
Spine-Related Health Care Utilization and Costs Following Orthobiologic Injection Versus Lumbar Surgery for Degenerative Spine Conditions

Lentz, T.; Burrows, J.; Brucker, A.; Wong, A. I.; Qualls, L.; Divakaran, R.; Centeno, C.; Suther, T.; Thomas, L.

2026-04-02 · orthopedics · doi:10.64898/2026.03.31.26349877 · medRxiv
Top 0.1% · match score 12.7%

Background: Lumbar fusion and decompression procedures are widely used for degenerative spine conditions but are associated with substantial health care costs and variable outcomes. Orthobiologic treatments, including platelet-rich plasma (PRP) and bone marrow aspirate concentrate (BMAC), have emerged as less invasive options for select patients who meet surgical criteria. However, concerns remain that orthobiologic care may delay rather than avert surgery, potentially increasing downstream utilization and costs. Comparative evidence on real-world utilization and costs is limited.

Methods: We conducted a retrospective, observational study using linked commercial insurance claims and a national orthobiologic treatment registry. Adults with lumbar degenerative disc disease (DDD) who met criteria for lumbar fusion or laminectomy, foraminotomy, discectomy, and facetectomy (LFDF) procedures, and who received PRP injection (with or without BMAC) or surgery between 2016 and 2023, were included. Two comparisons were evaluated: PRP versus lumbar fusion and PRP versus lumbar decompression procedures. Propensity score matching was used to balance cohorts on demographic characteristics, comorbidities, spine-related diagnoses, prior health care use, and severity proxies. Outcomes included spine-related health care resource use and aggregate costs at 12 and 24 months, with exploratory analyses at 36 and 48 months. Costs were estimated using multiple approaches, including Medicare-based estimates and commercial payer methods.

Results: After matching, 133 patients receiving PRP were compared with 2,560 patients undergoing fusion, and 198 patients receiving PRP were compared with 3,960 patients undergoing LFDF. Rates of subsequent spine surgery following PRP were low and below cell suppression thresholds through 24 months, with similar findings in exploratory longer-term analyses. Compared with surgical cohorts, patients receiving PRP had lower rates of postoperative imaging, home health services, and outpatient visits, with no consistent differences in opioid use, magnetic resonance imaging, or physical therapy. At 12 and 24 months, mean aggregate costs were significantly higher for fusion and LFDF cohorts across most costing methods. Cost differences were largest for fusion comparisons and were driven primarily by index procedure costs and higher reoperation and imaging rates in surgical cohorts. Findings were generally consistent across sensitivity and exploratory analyses.

Conclusions: Among select patients with degenerative spine conditions who meet surgical criteria, PRP was associated with lower health care utilization and substantially lower costs compared with lumbar fusion or LFDF, without evidence of increased progression to surgery. These findings support consideration of orthobiologic options for appropriately selected patients when surgery is not the only viable treatment option. Limitations include selection bias, absence of patient-reported outcomes, and claims-based severity measures.
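Propensity score matching, as used above, pairs each treated patient with the nearest-scoring control. A minimal greedy nearest-neighbour sketch with a caliper, operating on precomputed, invented scores; the study's actual matching specification may differ.

```python
# Minimal sketch of greedy 1:1 nearest-neighbour matching on
# precomputed propensity scores, with a caliper and without
# replacement. Scores and IDs are invented for illustration.

def greedy_match(treated, controls, caliper=0.05):
    """treated/controls: lists of (id, score) pairs.
    Returns a {treated_id: control_id} mapping."""
    pool = dict(controls)
    matches = {}
    # Match higher-score treated units first (a common heuristic).
    for tid, score in sorted(treated, key=lambda x: x[1], reverse=True):
        if not pool:
            break
        cid = min(pool, key=lambda c: abs(pool[c] - score))
        if abs(pool[cid] - score) <= caliper:
            matches[tid] = cid
            del pool[cid]          # without replacement
    return matches

m = greedy_match(
    treated=[("t1", 0.81), ("t2", 0.40)],
    controls=[("c1", 0.79), ("c2", 0.42), ("c3", 0.95)],
)
```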

8
Integrative Multi-cohort Transcriptomics and Network Pharmacology Analysis Reveals Key Network Nodes and Potential Drug Clues in PCOS Granulosa Cells

Zhang, X.; Fang, J.; Liu, Z.; Li, S.; Jin, F.; Guo, L.; Qiang, R.; Zhu, Y.; Hou, T.; Li, J.; Liu, Y.

2026-04-06 · systems biology · doi:10.64898/2026.04.01.715808 · medRxiv
Top 0.1% · match score 12.5%

Background: Polycystic ovary syndrome (PCOS) is a prevalent endocrine disorder with complex pathophysiology and limited therapeutic options. Identifying key molecular drivers and potential drug candidates is critical for improving clinical outcomes.

Methods: We integrated multi-cohort transcriptomics (GSE155489, GSE138518, GSE226146) with weighted gene co-expression network analysis (WGCNA), protein-protein interaction (PPI) network analysis, and drug repurposing. Differential expression analysis identified 1,039 DEGs, and WGCNA identified 10 PCOS-associated modules. Intersection of DEGs with module genes yielded 498 core candidate genes, which were subjected to functional enrichment, PPI network analysis, and connectivity map-based drug repurposing (CLUE/LINCS). Candidate drugs were further evaluated by molecular docking and ADMET prediction using a triple intersection strategy (hub genes, high differential expression, drug-target evidence).

Results: Functional enrichment revealed significant enrichment in cell adhesion and TGF-beta signaling. PPI network analysis identified CD44 as the top hub gene (degree=42). Drug repurposing identified 106 candidate drugs, including troglitazone and enzalutamide. Using the triple intersection strategy, five genes (ID2, NR4A1, GJA5, ID1, MYH11) were prioritized for molecular docking. GJA5 showed strong predicted binding affinity with flufenamic acid (-7.88 kcal/mol), and cytosporone B exhibited favorable druglikeness (0 Lipinski violations).

Conclusion: This study systematically characterizes PCOS-associated gene networks and provides a prioritized set of candidate targets and drugs through a purely computational framework. CD44 emerges as a key network node with potential relevance in PCOS pathophysiology. These findings offer testable hypotheses for future mechanistic studies and drug discovery efforts in PCOS.
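The druglikeness result above counts Lipinski rule-of-five violations. A tiny illustrative checker of those four thresholds; real property values would come from a cheminformatics toolkit, and the example numbers below are invented.

```python
# Tiny checker for Lipinski's rule of five (the druglikeness filter the
# abstract's "0 Lipinski violations" refers to). Property values are
# invented; in practice they come from a cheminformatics toolkit.

def lipinski_violations(mw, logp, h_donors, h_acceptors):
    """Count violated rules: MW > 500 Da, logP > 5,
    H-bond donors > 5, H-bond acceptors > 10."""
    rules = [mw > 500, logp > 5, h_donors > 5, h_acceptors > 10]
    return sum(rules)

# Hypothetical small molecule comfortably inside all four limits.
v = lipinski_violations(mw=320.4, logp=2.1, h_donors=1, h_acceptors=4)
```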

9
Ethnic inequalities in respiratory virus epidemics in England: a mathematical modelling study

Robert, A.; Goodfellow, L.; Pellis, L.; van Leeuwen, E.; Edmunds, W. J.; Quilty, B. J.; van Zandvoort, K.; Eggo, R. M.

2026-04-21 · infectious diseases · doi:10.64898/2026.04.18.26350858 · medRxiv
Top 0.1% · match score 12.0%

Background: In England, the burden of respiratory infections varies by ethnicity, contributing to health inequalities, but the role of additional demographic factors remains underexplored. We quantified how differences in social mixing and demographic characteristics between ethnic groups cause inequalities in transmission dynamics.

Methods: We analysed the association between ethnicity and the number of contacts of 12,484 participants in the 2024-2025 Reconnect social contact survey, using a negative binomial regression model. We simulated respiratory pathogen epidemics using a compartmental model stratified by age, ethnicity, and contact levels, at a national level and in major cities in England.

Findings: After adjusting for demographic variables, participants of Black and Mixed ethnicities had more contacts than those of White ethnicity (rate ratios (RR): 1.18 [95% Credible Interval (CI): 1.11-1.26] and 1.31 [95% CI: 1.14-1.52]). Participants of Asian ethnicity had fewer contacts (RR: 0.85 [95% CI: 0.79-0.91]). In national-level simulations, individuals of White ethnicity had the lowest attack rates due to demographic differences and mixing patterns. Local demographic structures changed simulated dynamics: attack rates in individuals of Black and Mixed ethnicities were approximately double those of White ethnicity in Birmingham, but less than 60% higher in Liverpool.

Interpretation: Demographic characteristics and mixing patterns create inequalities in transmission dynamics between ethnicities, while local demographic characteristics and pathogen infectiousness change the expected relative burden. To ensure mitigation strategies are effective and equitable, their evaluation must explicitly account for inequalities arising from local context.

Funding: Medical Research Council, National Institute for Health and Care Research, Wellcome Trust.

Research in context

Evidence before this study: We searched PubMed for population-based studies quantifying differences in respiratory infections between ethnic groups, up to 1 April 2026, with no language restrictions. Keywords included: (respiratory pathogens OR influenza OR COVID-19) AND (ethnic* OR race) AND (inequ*) AND (compartmental model OR incidence rate ratio OR hazard ratio). We excluded studies that focused on non-respiratory pathogens (e.g. looking at consequences of COVID-19 on incidence of other pathogens). A population-based cohort study showed that influenza infection risk was higher in South Asian, Black, and Mixed ethnic groups compared to White ethnicity in England. Another population-based cohort study highlighted that during the first wave of COVID-19 in England, the South Asian, Black, and Mixed ethnic groups were more likely to test positive and to be hospitalised than the White ethnic group. Census data in England showed that the distributions of age, household size, household income and employment status differed between ethnic groups, and the recent Reconnect social contact surveys highlighted the impact of each demographic factor on participants' number of contacts.

Added value of this study: Our study shows that social contact patterns, mixing, and demographic structure all lead to unequal infection risk between ethnic groups in respiratory pathogen epidemics. Using the largest available social contact survey in England, we show that both the average number of contacts and the proportion of high-contact individuals varied by ethnic group, even after adjusting for participants' demographics. These differences, together with mixing patterns and age structure, led to lower expected incidence among individuals of White ethnicity than in all other ethnic groups in simulated outbreaks. The level of inequality between ethnic groups changed when we used different values of pathogen transmissibility. Finally, as ethnic composition and population structure differ between cities in England, our results show differences in expected inequalities at a local level.

Implications of all the available evidence: Inequalities in infection risk between ethnic groups are context- and pathogen-dependent. They arise from both local population structure and contact patterns. Detailed information on mixing between groups and population structure is needed to accurately measure group-specific infection risk. These findings indicate that public health interventions based only on national-level estimates conceal regional variation in risk and may ultimately increase inequalities. Public health interventions need to be tailored to local contexts to be equitable and effective. Finally, our findings provide a foundation for understanding the progression from infection-risk inequalities to disparities in disease presentation and clinical outcomes.
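The core modelling result, that groups with more contacts experience higher attack rates, can be illustrated with a toy two-group SIR model and a simple contact matrix. This is not the preprint's age- and ethnicity-stratified model; all parameters below are invented.

```python
# Toy two-group SIR model with a contact matrix, showing that the
# higher-contact group ends with the higher attack rate. Illustrative
# parameters only, not the preprint's fitted model.

def two_group_sir(beta, contact, gamma=0.2, days=300, dt=0.1):
    """contact[g][h]: contacts a member of group g has with group h.
    Returns the final attack rate (fraction ever infected) per group."""
    s = [0.999, 0.999]          # susceptible fractions
    i = [0.001, 0.001]          # infectious fractions
    for _ in range(int(days / dt)):
        # Force of infection on each group from both groups' prevalence.
        foi = [beta * sum(contact[g][h] * i[h] for h in range(2))
               for g in range(2)]
        new_inf = [foi[g] * s[g] * dt for g in range(2)]
        for g in range(2):
            s[g] -= new_inf[g]
            i[g] += new_inf[g] - gamma * i[g] * dt
    return [1 - s[g] for g in range(2)]

# Group 0 mixes more heavily than group 1.
attack = two_group_sir(beta=0.05, contact=[[8, 2], [2, 4]])
```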

10
Neurogenic dysphagia as an independent driver of hospital length of stay and costs: a Bayesian analysis with geriatric stratification and intervention simulation

Werner, C. J.; Meyer, T.; Pinho, J.; Mall, B.; Schulz, J. B.; Schumann-Werner, B.

2026-04-10 · health economics · doi:10.64898/2026.04.08.26350417 · medRxiv
Top 0.1% · match score 10.7%

Purpose: Neurogenic dysphagia is prevalent in neurological inpatients and associated with adverse outcomes, yet its independent economic impact after adjustment for frailty and functional status remains poorly quantified. We aimed to estimate the independent effect of dysphagia on hospital length of stay (LOS) and costs, to test whether this effect differs between geriatric and non-geriatric patients, and to quantify the probability and magnitude of cost savings from improvements in swallowing function. Methods: We analysed 10,375 neurological inpatient cases (2021-2024) at a German university hospital. Dysphagia was defined by fiberoptic endoscopic evaluation of swallowing (FEES) or ICD-10 R13 coding (n = 1,382; 13.3%). Bayesian Gamma-log regression with informative priors from historical data and published literature was used to model LOS and total case costs (German DRG), adjusted for age, sex, Hospital Frailty Risk Score (HFRS, R13-adjusted), self-care index ("Selbstpflege-Index", SPI), stroke status, and emergency admission. A geriatric cohort was defined as age >=70 and adjusted HFRS >=5 (n = 2,053; 19.8%). Posterior predictive simulation estimated cost savings for hypothetical improvements of 1-3 points on the Functional Oral Intake Scale (FOIS). Results: After comprehensive adjustment, dysphagia was independently associated with 46.5% longer LOS (posterior ratio 1.465; 95% credible interval [CrI] 1.397-1.537) and 28.2% higher total case costs (ratio 1.282; CrI 1.213-1.354). The dysphagia x geriatric interaction was small but credible and ran in opposite directions: slightly attenuated for LOS (interaction ratio 0.908, CrI 0.837-0.986) but slightly amplified for costs (1.096, CrI 1.012-1.185), consistent with complexity-driven DRG grouping in geriatric patients. The absolute economic burden remained larger in the geriatric cohort due to higher baseline costs. 
In the geriatric cohort, a one-point FOIS improvement yielded a 74.3% posterior probability of LOS-based savings (mean EUR 555/case); at three points, this rose to 84.2% (mean EUR 1,115/case). The direct cost model confirmed high benefit probabilities from the payer's perspective (82.6% at dFOIS = 3). Conclusions: Neurogenic dysphagia is an independent and substantial driver of hospital LOS and costs in neurological inpatients, even after adjustment for frailty and functional status. The proportional effect on costs is slightly larger in geriatric patients, while the LOS effect is slightly smaller, consistent with the mechanics of the G-DRG system. Bayesian simulation indicates that improvements in swallowing function carry a high probability of generating cost savings, supporting the characterisation of dysphagia as a modifiable economic target with particular relevance to geriatric neurology.
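The posterior predictive simulation described above can be sketched in miniature. The following is an illustrative Monte Carlo calculation, not the authors' model: it assumes a normal posterior for the log cost-ratio (all numbers are invented) and computes the probability and mean size of per-case savings, analogous in spirit to the paper's "74.3% posterior probability of savings" style of result.

```python
import math
import random

def prob_savings(mean_log_ratio, sd_log_ratio, baseline_cost,
                 n_draws=100_000, seed=1):
    """Monte Carlo posterior summary: P(cost ratio < 1) and the mean
    per-case saving, given a normal posterior on the log cost-ratio."""
    rng = random.Random(seed)
    savings = []
    for _ in range(n_draws):
        log_ratio = rng.gauss(mean_log_ratio, sd_log_ratio)
        savings.append(baseline_cost * (1.0 - math.exp(log_ratio)))
    p_benefit = sum(s > 0 for s in savings) / n_draws
    mean_saving = sum(savings) / n_draws
    return p_benefit, mean_saving

# Illustrative inputs: a ~5% mean cost reduction with wide uncertainty
p, s = prob_savings(math.log(0.95), 0.08, baseline_cost=10_000)
```

With these made-up inputs the benefit probability lands around 0.74; the same machinery applied to posterior draws from a fitted Gamma-log model yields the savings probabilities reported in the abstract.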

11
Screening for prostate cancer using PSA with and without MRI: systematic reviews with meta-analysis

Pillay, J.; Gaudet, L. A.; Rahman, S.; Grad, R.; Theriault, G.; Dahm, P.; Todd, K. J.; Macartney, G.; Thombs, B.; Saba, S.; Hartling, L.

2026-03-31 primary care research 10.64898/2026.03.30.26349764 medRxiv
Top 0.1%
10.6%

Background: Previous recommendations on screening for prostate cancer relied on ongoing trials of screening with prostate-specific antigen (PSA), which may have lacked sufficient follow-up duration to fully examine effects on mortality and overdiagnosis. Findings that consider absolute effects by age and screening intensity, along with newer guidance for assessing evidence certainty, may lead to different interpretations. Adding magnetic resonance imaging (MRI) to PSA-based screening has been raised as a way to reduce false positives (FPs) and overdiagnosis. Methods: We systematically searched MEDLINE, Embase, and CENTRAL from 2014 to January 28, 2026, for randomized controlled trials (RCTs) and prospective observational studies of: (i) screening versus no screening and (ii) sequential screening with MRI for those with a positive PSA test versus PSA alone among men not known to be at high risk for prostate cancer. Studies on screening with PSA or digital rectal examination (DRE) published pre-2014 were identified from existing systematic reviews and reference lists. Studies on FPs and complications from biopsies after PSA screening did not require a control group. Paired reviewers screened titles/abstracts (assisted with artificial intelligence) and full texts, assessed risk of bias, and extracted data, by age when available. We pooled data when suitable using random-effects models, investigated heterogeneity, and assessed the certainty of evidence using GRADE, with conclusions about effects drawn from decision thresholds for absolute effect sizes. Results: Across both questions, we included 15 RCTs (N=856,000; 8 sites of ERSPC considered separate trials) and 8 observational studies (N=56,122).
At 20 years, among 1000 men who underwent repeated PSA-based screening every 2-4 years starting from age 55-69 (mean 62), there is likely a reduction in prostate-cancer mortality (≥2 fewer) and metastatic cancer incidence (≥6 fewer), at the expense of prostate-cancer overdiagnosis (≥24 cases) and FPs (≥150 cases) (all moderate certainty). If screening starts at age 50-54 or age 55, the benefits are probably smaller (e.g., 1 vs. 2 fewer prostate-cancer related deaths) with similar harms. Adding DRE or screening with PSA annually does not add benefit. One round of PSA screening or starting screening later at age 70-74 may not offer any important benefit or harm (low to moderate certainty), and any benefit from screening primarily with DRE was not shown. Compared with PSA alone, sequential screening with PSA followed by MRI reduces FPs (≥33 fewer) and overdiagnosis (via ≥10 fewer diagnoses of clinically insignificant [e.g., Gleason 6] cancers without impacting detection of clinically significant cancers) (moderate to high certainty), though findings were limited to one round of screening without long-term follow-up or measurement of mortality. Interpretation: This review provides clinicians and other interest holders with anticipated absolute effects by age, and assessments of certainty across critical and important outcomes and with approximately two decades of follow-up. Findings apply to a general population and may differ for specific groups. Results for most critical outcomes, both benefits and harms, exceeded thresholds for clinically important effect sizes, thereby demonstrating the complexity of guideline developers' and patients' decision-making regarding screening trade-offs. Findings about adding MRI for those with a positive PSA test were limited and would require additional consideration of costs, infrastructure, expertise, and equity. Protocol registration: PROSPERO - CRD420250651056.
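The absolute effects per 1000 screened that this review reports can be derived from a relative risk and a baseline risk with a one-line calculation. A minimal sketch with hypothetical inputs (the RR and baseline risk below are illustrative, not values from the review):

```python
def events_prevented_per_1000(relative_risk, baseline_risk):
    """Absolute risk reduction per 1000 people: baseline risk x (1 - RR) x 1000."""
    return baseline_risk * (1.0 - relative_risk) * 1000.0

# e.g. a 20% relative reduction on a hypothetical 1% 20-year baseline risk
prevented = events_prevented_per_1000(relative_risk=0.80, baseline_risk=0.01)  # -> 2.0
```

This is why the same relative effect yields smaller absolute benefits when screening starts at younger ages, where baseline mortality risk is lower.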

12
Urine proteomic profiling at admission reveals complement biomarkers linked to alcohol-associated liver disease.

Prado, L. G.; Musich, R.; Taiwo, M.; Pathak, V.; Rotrof, D. M.; Bellar, A.; Welch, N.; Dasarathy, J.; Streem, D.; for the AlcHepNet; Dasarathy, S.; Nagy, L. E.

2026-04-07 immunology 10.64898/2026.04.04.716339 medRxiv
Top 0.1%
10.6%

Background and aims: Circulating complement is associated with the occurrence of alcohol-associated hepatitis (AH) and is a potential biomarker to distinguish AH from alcohol-associated cirrhosis (AC). Complement contributes to kidney injury, a condition often occurring in patients with alcohol-associated liver disease (ALD). However, little is known regarding complement in the cross talk between liver and kidney in ALD. Here we tested the hypothesis that urinary complement would provide potential biomarkers for ALD and insights into mechanisms of liver-kidney crosstalk in the pathogenesis of ALD. Methods: Plasma and urine were collected at admission from patients with severe AH (sAH), healthy controls (HC), and heavy drinkers without liver disease (HD) (from the multicenter Alcohol Hepatitis Network) and with AC (from the Northern Ohio Alcohol Center). Urine was subjected to unbiased proteomics analysis and plasma complement assessed by multiplex/ELISA assays. 30- and 90-day mortality was tracked in patients with sAH. Results: All three complement activation pathways were perturbed in plasma and urine of patients with sAH and AC compared to HC and HD. Components of the lectin and classical pathways in urine were associated with 30- and 90-day mortality in patients with sAH. When 4 complement proteins were combined, they distinguished sAH from AC (AUC 0.78, versus 0.65 for MELD). There was no correlation between complement in plasma and urine, suggesting an independent impact of sAH on complement in kidney and liver. Conclusion: The urinary proteome revealed complement protein signatures associated with sAH and AC, providing valuable insights into the potential for complement biomarkers and the mechanisms of liver-kidney crosstalk in ALD.
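An AUC such as the 0.78 reported for the combined complement panel has a simple rank-based interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A self-contained sketch using toy scores (not the study's data):

```python
def auc(case_scores, control_scores):
    """Mann-Whitney form of the AUC: the fraction of case/control pairs
    in which the case scores higher (ties count one half)."""
    pairs = len(case_scores) * len(control_scores)
    wins = sum(
        1.0 if c > k else 0.5 if c == k else 0.0
        for c in case_scores
        for k in control_scores
    )
    return wins / pairs

# Mostly-separated toy scores: 8 of 9 case/control pairs are ordered correctly
example = auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3])  # -> 8/9
```

An AUC of 0.5 means the marker is uninformative; 1.0 means perfect separation of the two groups.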

13
Understanding inequalities in COVID-19 vaccination between migrants and non-migrants in Germany: The role of psychological factors of vaccine behaviour

Bartig, S.; Siegert, M.; Hoevener, C.; Michalski, N.

2026-04-17 public and global health 10.64898/2026.04.15.26350844 medRxiv
Top 0.1%
10.5%

Background: Understanding the underlying mechanisms for differences in vaccine uptake between migrants and non-migrants is crucial in order to design targeted interventions encouraging vaccination and to ensure vaccine-related equity. Therefore, this study examined to what extent migration-related disparities in COVID-19 vaccination were associated with psychological factors, based on the established 5C model of vaccine behaviour (Confidence, Complacency, Constraints, Calculation, Collective Responsibility). Methods: Data were obtained from the German study "Corona Monitoring Nationwide - Wave 2" (RKI-SOEP-2 study), which was carried out between November 2021 and March 2022. The association between COVID-19 vaccination and migration status, while considering the psychological factors, was investigated using multivariable binary logistic regressions. A decomposition analysis (Karlson-Holm-Breen method) was conducted to examine the extent to which migration-related disparities in vaccine uptake were associated with the psychological factors of the 5C framework. Results: Migrants were less likely to be vaccinated against COVID-19 compared to non-migrants, especially participants from the Middle East and North Africa (MENA) region. Our decomposition showed that almost two-thirds of the disparities in COVID-19 vaccine uptake between migrants and non-migrants were associated with the psychological factors (first-generation: 61.2%, second-generation: 64.2%). Confidence in safety of the vaccine was the most relevant factor in the 5C framework. Furthermore, the results highlighted the importance of a differentiated analysis regarding country of origin: While the 5C model accounted for only 19.4% of the difference between participants from the MENA region and non-migrants, the proportion for participants from Eastern Europe was 73.5%, suggesting that the underlying mechanisms for the lower uptake in the MENA group need further investigation. 
Conclusions: Overall, migration-related disparities in COVID-19 vaccination were significantly associated with differences in psychological factors of vaccine behaviour. To increase vaccine acceptance within the heterogeneous group of migrants in general, tailored and proactive health communication interventions are needed.
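At its core, the decomposition above compares the total group difference with the part that remains after mediators are accounted for. A schematic of that arithmetic with invented coefficients (the KHB method itself additionally rescales logit coefficients so the two models are comparable, which this sketch omits):

```python
def percent_mediated(total_effect, direct_effect):
    """Share of a group gap accounted for by mediators:
    (total - direct) / total, expressed as a percentage."""
    return 100.0 * (total_effect - direct_effect) / total_effect

# Hypothetical log-odds gaps in vaccine uptake: -0.90 before and
# -0.35 after adjusting for the 5C factors
share = percent_mediated(total_effect=-0.90, direct_effect=-0.35)
```

With these made-up numbers about 61% of the gap is attributed to the mediators, the same kind of quantity as the study's reported 61.2% and 64.2%.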

14
BMI and Varus Malalignment Compound to Define a High-Risk Phenotype for Compartment-Specific Knee Osteoarthritis Progression

White, M. S.; Kogan, F.; Delp, S. L.; Chu, C. R.; Sherman, S. L.; Pai S, A.; Gold, G. E.; Chaudhari, A. S.; Gatti, A. A.

2026-04-17 orthopedics 10.64898/2026.04.15.26350819 medRxiv
Top 0.2%
10.4%

Objectives: Knee osteoarthritis (KOA) is a leading cause of disability, yet which patients will experience structural decline remains unclear. Body mass index (BMI) and lower limb alignment are established risk factors for KOA, but their independent and interactive effects on compartment-specific cartilage loss and total knee replacement (TKR) have not been characterized at scale. Methods: We analyzed 5,832 limbs from 3,016 participants in the Osteoarthritis Initiative followed over 7 years. Cartilage thickness in the weight-bearing medial and lateral femur and tibia was quantified, and lower limb alignment was measured using hip-knee-ankle (HKA) angle obtained from full-limb radiographs. Linear mixed-effects models estimated the independent and interactive effects of BMI and lower limb alignment on longitudinal cartilage thinning, and mixed-effects logistic regression modeled TKR risk. Results: In the medial compartment, BMI and varus alignment interacted multiplicatively, with their combined effect exceeding the sum of independent contributions (femur: p = 0.011; tibia: p < 0.001). At +10 kg/m2 BMI and +10 degrees varus, the rate of medial femur cartilage thinning was 243.5% faster than the reference rate. In the lateral compartment, BMI and valgus alignment were independently associated with faster cartilage thinning, with no significant interaction. TKR risk increased exponentially with HKA deviation (odds ratio [OR] = 1.38 per 1 degree; ~five-fold at 5 degrees malalignment) but was not associated with BMI. Conclusion: BMI and lower limb alignment influence structural KOA progression through compartment-specific pathways. The multiplicative interaction in the medial compartment identifies high BMI combined with varus malalignment as a discrete high-risk phenotype, with implications for clinical risk stratification and disease-modifying intervention design.
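The "multiplicative interaction" finding corresponds to a product term in the linear predictor: with an interaction coefficient, the joint effect of BMI and varus exceeds the sum of the two main effects. A toy model with invented coefficients (not the fitted values from the study):

```python
def extra_thinning(bmi_delta, varus_delta,
                   b_bmi=0.5, b_varus=0.8, b_interaction=0.12):
    """Hypothetical linear predictor for additional cartilage thinning
    (% of reference rate): main effects plus a BMI x alignment product term."""
    return (b_bmi * bmi_delta
            + b_varus * varus_delta
            + b_interaction * bmi_delta * varus_delta)

additive = extra_thinning(10, 0) + extra_thinning(0, 10)  # main effects only
combined = extra_thinning(10, 10)                          # includes interaction
```

The positive product term is what makes `combined` exceed `additive`, defining the compound high-risk phenotype described in the abstract.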

15
Age-specific income losses due to HPV-attributable cancers in Singapore

Blythe, R.; Graves, N.; Iyer, N. G.; Peres, M. A.

2026-04-17 health economics 10.64898/2026.04.16.26351014 medRxiv
Top 0.2%
10.4%

Introduction: The link between Human Papillomavirus (HPV) and cancer is well established. In Singapore, bivalent HPV vaccines are subsidised for females, but not males. Economic analysis of HPV vaccination has generally assessed the costs to the health system, but this may not be as relevant to individual decision-making as potential lost income. We estimated the impact of bivalent HPV 16/18 vaccination on sick leave, unemployment, and premature mortality as a function of age and sex to understand the broader impact of HPV-related cancers. Methods: We developed a population-level economic model to estimate lifetime income losses by diagnosis age, sex and cancer type. We applied sex- and cancer-specific Cox regressions to the Singapore Cancer Registry for annual predicted survival from 1992 to 2022. These were combined with census and employment data to estimate HPV-associated income losses in Singapore. Attributable fractions and vaccine effectiveness data for HPV 16/18 from the literature were used to estimate the effectiveness of bivalent HPV vaccination. Structural sensitivity analysis examined the role of 80% population coverage conferring herd immunity. Results: The registry contained 17,294 individuals with an HPV-associated cancer diagnosis. Lost income was greatest for cervical cancer due to its high prevalence; however, the losses per diagnosis were highest for oropharyngeal cancer. Bivalent HPV vaccination led to income benefits of $SGD1,397 [$895 to $1,838] in girls and -$62 [-$76 to -$48] in boys. Gender-neutral HPV vaccination of 80% of 15-year-old Singaporeans, conferring herd immunity, would have lifetime income-protective benefits of $24.4m [$14.2m, $33.7m] per cohort, a five-fold return on investment. Conclusions: In addition to avoiding healthcare costs and lost quality of life, parents should consider vaccination as a means of avoiding potential income losses.
A national policy of gender-neutral HPV vaccination could deliver substantial income protection due to both individual vaccine protection and herd immunity.
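The income-loss logic pairs cancer-specific survival curves with employment and earnings. A deliberately simplified sketch (all inputs hypothetical; the paper's model additionally stratifies by age, sex, and cancer type and applies HPV-attributable fractions):

```python
def expected_income_loss(annual_income, employment_rate, survival_probs):
    """Expected earnings lost to premature mortality: in each year after
    diagnosis, income is forgone with probability (1 - survival)."""
    return sum(annual_income * employment_rate * (1.0 - s)
               for s in survival_probs)

# Three post-diagnosis years with hypothetical declining survival
loss = expected_income_loss(50_000, 0.8, [1.0, 0.9, 0.8])
```

Multiplying such per-diagnosis losses by vaccine-preventable case counts gives the cohort-level income protection figures quoted above.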

16
Trade-offs in emergency transport protocols for access to hip fracture management: a geospatial analysis of selective versus standard transfer in Ontario long-term care

Yee, N. J.; Chen, T.; Huang, Y. Q.; Whyne, C.; Halai, M.

2026-04-14 orthopedics 10.64898/2026.04.12.26350713 medRxiv
Top 0.2%
10.4%

Objectives: For suspected hip fractures, prehospital protocols directing patients to an orthopaedic centre rather than the nearest emergency department (ED) could reduce time-to-surgery but may increase EMS travel burden. This study evaluates the impact of transfer protocols by quantifying transport to hospitals from long-term care (LTC) facilities across Ontario. Methods: A retrospective cross-sectional analysis of all Ontario LTC facilities and hospitals was performed. Two protocols were modeled: standard transfer to the nearest ED with subsequent transfer if required, and selective transfer, based on Collingwood Hip Fracture Rule prehospital screening, directly to the nearest ED with orthopaedic services (orthoED). Median one-way travel distances were calculated from Google Maps. Results: In Ontario, 15.4% of LTC residents require hospital destination decisions because their nearest ED lacks orthopaedic services; for these facilities, median distances were 2.7 km to the ED and 36.0 km to the orthoED. Among the 52 LTC facilities where selective transfer was distance-optimal, it substantially reduced travel for patients with hip fracture (31.1 km vs 49.6 km; P<.01) while only modestly increasing travel for patients without hip fracture. Where standard transfer was distance-optimal, little travel difference was noted for patients with hip fracture; however, falsely screen-positive patients traveled significantly further to an orthoED. The greatest negative consequences of selective transfer fall on the 1.3% of residents living farthest (>100 km) from an orthoED. Conclusions: EMS direct transportation to hospitals with orthopaedics may improve hip fracture care but can increase EMS burden due to patients falsely identified as having a hip fracture, particularly in remote communities.
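The trade-off the authors quantify can be framed as an expected one-way distance under a screening rule: screen-positives go straight to the orthopaedic ED, screen-negatives to the nearest ED. A stylized sketch (the fracture prevalence, sensitivity, and specificity below are invented; the study measured real road distances rather than applying this formula):

```python
def expected_distance_selective(p_fracture, sensitivity, specificity,
                                d_nearest_ed, d_ortho_ed):
    """Mean one-way transport distance under a selective-transfer rule."""
    p_positive = (p_fracture * sensitivity
                  + (1.0 - p_fracture) * (1.0 - specificity))
    return p_positive * d_ortho_ed + (1.0 - p_positive) * d_nearest_ed

# Perfect screening: only true fractures incur the longer orthoED trip
d = expected_distance_selective(0.5, 1.0, 1.0,
                                d_nearest_ed=2.7, d_ortho_ed=36.0)
```

Lowering specificity raises `p_positive` and pulls the expected distance toward the orthoED trip, which is exactly the false-positive travel burden the abstract highlights.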

17
Methods of adjustment for public health and social measures in post-licensure vaccine studies in children in sub-Saharan Africa: a systematic review

Ndeketa, L.; Vaselli, N. M.; Pitzer, V. E.; Dodd, P. J.; Hungerford, D.; French, N.

2026-04-02 infectious diseases 10.64898/2026.04.01.26349767 medRxiv
Top 0.2%
10.3%

Background: Post-licensure vaccine effectiveness and impact studies provide evidence on how vaccines perform under routine programme conditions in the real world. In sub-Saharan Africa (SSA), vaccine introductions frequently coincide with concurrent public health and social measures that may influence disease risk and transmission. Failure to account for these concurrent interventions may affect the interpretation of vaccine effects. Methods: We conducted a systematic review of post-licensure vaccine effectiveness and impact studies conducted in children under five years of age in SSA. Electronic databases were searched for peer-reviewed studies published between January 2000 and December 2019. Eligible studies used observational designs to estimate vaccine effectiveness or population-level impact. Two reviewers independently screened studies, extracted data, and assessed methodological quality using Joanna Briggs Institute tools. We examined study designs, vaccines evaluated, outcomes assessed, and whether public health and social measures (PHSMs) were measured or adjusted for. A narrative synthesis was undertaken. In addition, we conducted a meta-analysis for rotavirus and pneumococcal conjugate vaccines, exploring heterogeneity in individual-level effectiveness estimates where designs and outcomes were comparable. Results: Sixty-four studies met the inclusion criteria, covering eight vaccine-preventable diseases. Rotavirus vaccines were most frequently evaluated, followed by pneumococcal conjugate vaccines. Case-control and ecological designs were most common, while cohort and time-series analyses were less frequently used. None of the included studies collected, reported, or adjusted for PHSMs such as nutrition, WASH, or access to healthcare. The implications of this omission varied by pathogen.
Rotavirus vaccine effectiveness estimates from comparable individual-level designs were consistent across settings, with no evidence of between-study heterogeneity. Pneumococcal vaccine effectiveness estimates showed substantial heterogeneity, which appeared to reflect differences in outcome definitions, host risk profiles, and study context. Estimates for other vaccines were generally protective in direction, although the magnitude and precision varied across studies. Conclusions: Post-licensure vaccine effectiveness and impact studies in SSA rarely account for concurrent PHSMs. The consequences of this omission are not uniform across vaccines. For some pathogens, effectiveness estimates appear robust to unmeasured contextual change, while for others they are highly sensitive to outcome choice and setting. Future evaluations should prioritise systematic measurement of key PHSMs and consider study designs that better account for time-varying context. Strengthening routine data systems to capture these factors is essential for generating interpretable evidence to inform immunisation policy.
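Random-effects pooling of this kind is commonly done with the DerSimonian-Laird estimator; the review does not state its exact estimator, so treat this as a generic sketch: the between-study variance τ² is estimated from Cochran's Q, then inverse-variance weights are recomputed, and τ² = 0 corresponds to the "no evidence of between-study heterogeneity" result reported for rotavirus.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate and between-study variance (tau^2)."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

# Identical study effects imply no heterogeneity (tau^2 = 0)
pooled, tau2 = dersimonian_laird([0.5, 0.5, 0.5], [0.10, 0.10, 0.10])
```

When study effects disagree more than their within-study variances allow, τ² grows and the pooled estimate's weights flatten across studies.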

18
Targeting a malaria merozoite surface protein with mRNA vaccine generates multifunctional antibodies

Thomas, A. A.; Runz, T.; Ho, T.; Fabb, S.; Lee, C. L.; Chishimba, S.; Mugan, R. S.; Reiling, L.; Kurtovic, L.; DSouza, C.; Pouton, C.; Beeson, J.

2026-03-29 immunology 10.64898/2026.03.26.714647 medRxiv
Top 0.2%
10.2%

Introduction: Malaria is a leading health problem with high disease burden and mortality rates worldwide. Currently approved vaccines target the sporozoite form of Plasmodium falciparum that initially infects the liver, but provide only modest protection against malaria in young children. There is an urgent need to develop next-generation malaria vaccines that target multiple parasite developmental stages for greater efficacy. Antibodies to merozoites, which are involved in blood-stage replication and associated with clinical illness, have multiple functional activities and can protect against malaria. A promising merozoite vaccine candidate is Merozoite Surface Protein 2 (PfMSP2). Antibodies to PfMSP2 can promote multiple antibody Fc-mediated functional activities to clear merozoites. Methods: We developed and evaluated monovalent and bivalent (3D7 and FC27 variants) PfMSP2-based mRNA vaccines. We designed and codon-optimised mRNA, which was validated for in vitro expression in mammalian cells, and subsequently formulated as lipid nanoparticles for vaccination of mice in a 3-dose regimen. Vaccination with recombinant PfMSP2 protein with adjuvant was performed for comparison. We evaluated the induction of antibodies and functional activities relevant to protective immunity. Results: mRNA vaccines induced prominent IgG responses using monovalent (3D7 allele) and bivalent (3D7 and FC27 alleles) vaccines encoding near full-length PfMSP2, and antibodies recognised the surface of whole merozoites. Vaccine responses were equivalent or superior to those of a recombinant protein-based PfMSP2 vaccine. The bivalent vaccine induced equivalent antibodies to the two PfMSP2 alleles. Vaccination induced cytophilic IgG subclasses with multiple functional activities, including complement fixation, binding of human Fcγ-receptors I and IIa, and opsonic phagocytosis.
Conclusions: PfMSP2 is highly immunogenic using the mRNA vaccine platform and induces antibodies with multiple functional activities associated with protective immunity in humans. Combining PfMSP2 with other merozoite and sporozoite antigens is a promising strategy to develop highly efficacious vaccines to achieve malaria control and elimination goals.

19
The Metabolomic Signature of Stressful Life Events

Tian, Y.; Li-Gao, R.; Alshehri, T.; Brydges, C. R.; Arnold, M.; Mahmoudiandehkordi, S.; Kastenmuller, G.; Mook-Kanamori, D. O.; Rosendaal, F. R.; Giltay, E. J.; Xu, L.; Wang, J.; Jansen, R.; Bastiaanssen, T.; Penninx, B. W.; Kaddurah-Daouk, R.; Milaneschi, Y.

2026-04-04 epidemiology 10.64898/2026.04.02.26350045 medRxiv
Top 0.2%
10.1%

Stressful life events impact individuals' functionality and contribute to disease outcomes, yet the biological pathways underlying life stress remain unclear. We characterized the metabolomic profiles of stressful life events using data from 3,264 participants (5,163 observations) of the Dutch NESDA cohort. 98 metabolites were identified, with upregulated metabolites overrepresented in phosphatidylethanolamine and downregulated metabolites overrepresented in fatty acid metabolism. 92 of these metabolites were available in the Dutch NEO cohort (N=599): 11 were significantly replicated, including six lipids (among them three bile acids such as glycochenodeoxycholate 3-sulfate), one carbohydrate, and one xenobiotic. 21 overlapping metabolites were additionally available in the Chinese GBCS cohort (N=200): 10-undecenoate (11:1n1) (fatty acid) and glycochenodeoxycholate 3-sulfate (bile acid) showed consistent associations across both Dutch and Chinese cohorts. Stressful life events are associated with metabolic dysregulation, particularly involving fatty acid and bile acid pathways, highlighting promising biological targets to reduce the impact of stress on mental and somatic health.

20
Implementation of point-of-care screening for Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis among pregnant women in South Africa: a mixed-methods process evaluation of the Philani Ndiphile trial

Shaetonhodi, N. G.; De Vos, L.; Babalola, C.; de Voux, A.; Joseph Davey, D.; Mdingi, M.; Peters, R. P. H.; Klausner, J. D.; Medina-Marino, A.

2026-04-13 public and global health 10.64898/2026.04.08.26350414 medRxiv
Top 0.2%
9.9%

Background: Curable sexually transmitted infections (STIs), including Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, remain highly prevalent among pregnant women in South Africa. Despite poor diagnostic performance in pregnancy, syndromic management remains standard care. Point-of-care (POC) screening enables aetiological diagnosis and same-visit treatment but is not yet included in national guidelines. We conducted a mixed-methods process evaluation to examine determinants of antenatal POC STI screening implementation in public facilities. Methods: This evaluation was embedded within the three-arm Philani Ndiphile randomized trial (March 2021-February 2025) across four public clinics in the Eastern Cape. Screening used a near-POC, electricity-dependent nucleic acid amplification test with a 90-minute turnaround time. Reach, Adoption, Implementation, and Maintenance were assessed using the RE-AIM framework. Quantitative indicators included uptake of screening, treatment, and follow-up attendance. Qualitative data included in-depth interviews with 20 pregnant women and five focus group discussions with 21 research staff and government healthcare workers. The Consolidated Framework for Implementation Research guided qualitative analysis. Findings were integrated using narrative weaving. Results: Screening uptake was high (99.0%), with treatment coverage of 95.2% at baseline and 93.5% at repeat screening. Same-day treatment was lower (50.7% and 69.8%) and varied substantially by facility, reflecting operational constraints including turnaround time, patient volume, infrastructure, and electricity. Attendance was higher when screening was integrated into routine ANC. Women valued screening for infant health, while providers recognised advantages over syndromic management but highlighted workforce, resource, and maintenance constraints. Socioeconomic factors, including transport costs, hunger, and work commitments, influenced retention and waiting.
Conclusions: Antenatal POC STI screening was acceptable and achieved high treatment coverage in a research setting. However, same-day treatment was constrained by operational requirements of the testing platform. Scale-up will require workflow integration, strengthened health system capacity, and faster diagnostics suited to routine antenatal care. Key messages: What is already known on this topic: Syndromic management remains standard antenatal care in many low-resource settings despite failing to capture up to 89% of infections that remain asymptomatic. Point-of-care aetiological screening has demonstrated feasibility, acceptability, and potential clinical benefit in research settings, yet has not been widely adopted into national policy. Limited evidence exists on the health system requirements and contextual determinants influencing scale-up within routine public facilities. What this study adds: This mixed-methods process evaluation demonstrates high uptake and treatment coverage of antenatal POC STI screening in a trial setting, while identifying facility-level, structural, and socioeconomic factors shaping same-day treatment and retention. We show that implementation success varies substantially across clinics and depends on assay characteristics, workflow integration, human resources, infrastructure reliability, and follow-up capacity. How this study might affect research, practice or policy: These findings provide implementation-relevant evidence to inform national policy deliberations on integrating POC STI screening into antenatal care. Sustainable scale-up will require context-adapted delivery models, strengthened workforce and supply systems, faster diagnostics, and alignment with existing ANC workflows to ensure equitable and durable impact.